Channel: Learn By Watch
Category: Education
Tags: naive bayes classifier python, algorithm naive bayes, bayes theorem probability, bayes theorem proof, machine learning classification, naive bayes classifier, naive bayes algorithm, classification model, bayes theorem machine learning, classification theorem, naive bayes, machine learning tutorial, machine learning lectures, classification in machine learning, machine learning algorithms, naive algorithm, naive bayes algorithm in machine learning
Description: Naïve Bayes classifiers are highly scalable, requiring a number of parameters linear in the number of variables (features/predictors) in a learning problem. Here we look at another commonly used classification algorithm.
● Conditional probability - Reviewed some probability formulas that will be helpful in understanding Naive Bayes.
● Implementing conditional probability - Rewrote the above formulas in a form that can be implemented in our model.
● Example dataset - Examined the tennis-match dataset and some statistics related to it. We use only three features for our model.
● Stated what we are given for today's conditions and what we want to predict.
● Applied Bayes' theorem to find the probability of each outcome, and finally normalised the probabilities so they sum to one and are easier to interpret (see the sketch below).
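As a rough companion to the steps listed above, the sketch below walks through the same counting, Bayes'-theorem, and normalisation steps in plain Python on a tiny made-up tennis table. The feature names (Outlook, Humidity, Wind), the rows, and today's conditions are assumptions for illustration only, not the video's actual dataset or code.

```python
from collections import Counter, defaultdict

# Toy "play tennis" data with three features, assumed for illustration
# (the video's exact rows and feature names may differ).
data = [
    # (Outlook, Humidity, Wind, Play)
    ("Sunny",    "High",   "Weak",   "No"),
    ("Sunny",    "High",   "Strong", "No"),
    ("Overcast", "High",   "Weak",   "Yes"),
    ("Rain",     "High",   "Weak",   "Yes"),
    ("Rain",     "Normal", "Weak",   "Yes"),
    ("Rain",     "Normal", "Strong", "No"),
    ("Overcast", "Normal", "Strong", "Yes"),
    ("Sunny",    "High",   "Weak",   "No"),
    ("Sunny",    "Normal", "Weak",   "Yes"),
    ("Rain",     "Normal", "Weak",   "Yes"),
]

features = ["Outlook", "Humidity", "Wind"]

# Class priors: P(Play = Yes) and P(Play = No)
labels = [row[-1] for row in data]
priors = {c: n / len(data) for c, n in Counter(labels).items()}

# Conditional counts for P(feature = value | Play = c)
cond = defaultdict(lambda: defaultdict(Counter))
for *values, label in data:
    for feat, val in zip(features, values):
        cond[label][feat][val] += 1

def likelihood(label, feat, val):
    """P(feat = val | Play = label), estimated from the counts."""
    counts = cond[label][feat]
    return counts[val] / sum(counts.values())

def predict(today):
    """Score each class with P(c) * prod_i P(x_i | c), then normalise."""
    scores = {}
    for c in priors:
        score = priors[c]
        for feat, val in zip(features, today):
            score *= likelihood(c, feat, val)
        scores[c] = score
    total = sum(scores.values())
    # Normalise so the posterior probabilities sum to 1
    return {c: s / total for c, s in scores.items()}

# Today's conditions (assumed for the example): Sunny, Normal humidity, Strong wind
print(predict(("Sunny", "Normal", "Strong")))
```

Running the sketch prints the normalised posterior for each class, which is the same kind of "does it make sense as a probability" check the video performs after applying Bayes' theorem.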